Ablation Model FineWeb-Edu
License: Apache-2.0
This model is part of the FineWeb ablation experiments: a 1.82-billion-parameter Llama-architecture model trained on the FineWeb-Edu dataset and intended for English text-completion tasks. A minimal usage sketch follows the tags below.
Tags: Large Language Model · Transformers · English
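
A minimal loading and generation sketch using the Transformers library. The repository id below is an assumption for illustration (substitute the actual model id if it differs):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id; replace with the actual model id if different.
model_id = "HuggingFaceFW/ablation-model-fineweb-edu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Simple English text-completion prompt.
prompt = "The Milky Way is"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the completion deterministic.
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling parameters such as `temperature` or `top_p` can be passed to `generate` for more varied completions.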